ChatGPT for middle and high school students
2023-03-28
GPT-4 is a state-of-the-art artificial intelligence developed by OpenAI. This AI can do many things. We encourage you to see for yourself what it can do.
However, using GPT-4 requires a $20-per-month credit-card subscription. Some elementary, middle, and high school students and technical college students may be interested but unable to try it because their parents don't see the value. This is a lost opportunity.
Therefore, I would like to offer junior high and high school students throughout Japan the opportunity to experience GPT-4. The starting budget is 250,000 yen, enough for roughly 100 students' worth of subscriptions, or about 10,000 questions via the API.
Let me introduce myself a little. I am a mentor for Mitou Junior (the "Unexplored Junior" program) and I look through more than 100 applications every year. Among them are several applicants who know "what I want to do" but struggle with "how to take the first step." They may not have an adult nearby whom they can ask for technical advice. It is tempting to do something to help them, but with roughly ten applicants competing for each slot, it is hard to support them all by accepting their projects into Mitou Junior. GPT-4 is a technology that can help with this problem. If you don't know what exactly you need to do to achieve "what you want to do," try consulting GPT-4. A Google search for "what you want to do" often doesn't yield good results, but GPT-4 offers a variety of suggestions. You can notice approaches you weren't aware of, or learn keywords you didn't know existed. You may then find even better information by Googling the keywords you just learned.
Of course, GPT-4's answers can contain errors. But Google search results can also contain mistakes, and so can the advice of the adults around you. In the end, you are the one who decides what to do.
2023-07-08
We finished with the budget exhausted: a total of 55,603 messages in 4,295 rooms from 1,916 users, many times the number of opportunities originally planned.
Number of unique users: 1,916
Number of rooms: 4,295
Number of messages: 55,603
Services
The chat logs are stored on the server and I can read them. I believe this is necessary to observe usage and make it better.
I may edit and publish content that I deem more beneficial to share with others.
Rates
Up to 12 years old: Use with adults (free)
OpenAI's terms of use prohibit use by anyone under the age of 13. Please have an adult by your side and use it in the form of "the adult using it."
13 to 17 years old: Free
Under OpenAI's terms, you must get permission from a parent or legal guardian. Try saying something like, "It's free, so let me try it out."
Using this service to build up examples of "what you can do with GPT-4" could also be a good way to convince your parents to pay for your family's own subscription.
18+: You may use it, but you must pay Nishio for it in some way.
Actual example
Output from this service
user
I want to connect a plant to a musical instrument so that the plant can play. What are some ways to do this? Briefly explain in bullet points.
assistant
1. MIDI Sprouter: converts the plant's bioelectrical signals into MIDI data to play instruments.
2. Touch sensor: touch sensors attached to the plant's leaves and stems control instruments that produce sound when touched.
3. Acoustic vibration: picks up the minute vibrations caused by plant growth and water absorption and converts them into sound.
4. Light sensor: detects changes in light caused by the plant's shadow and generates sound.
5. Plant growth data: records the plant's growth process and converts the data into music.
user
3 in detail
assistant
Playing music with plants via acoustic vibration:
This method converts the minute vibrations caused by plant growth and water absorption into sound. It can be achieved with the following procedure.
1. Use piezoelectric (piezo) sensors to detect the minute vibrations generated by the plant's stems and leaves. These sensors convert pressure and vibration into electrical signals.
2. Amplify the collected electrical signals with an amplifier and send them to a computer through an audio interface.
3. Use music software (a DAW) on the computer to analyze the collected vibration data and convert it into sound. At this point you can adjust the frequency and amplitude to process it into an audible sound.
4. Play the converted sound through speakers or instruments (synthesizer, piano, etc.). This produces an effect as if the plant were performing.
Using this method, one can enjoy the minute vibrations caused by plant growth and absorption of water as music. It is also possible to observe changes in sound as a result of changes in the condition of the plants and their environment.
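The DAW step in the procedure above (turning analyzed sensor data into audible pitches) can be sketched in code. This is purely illustrative: the normalized sensor readings and the linear pitch mapping are made-up assumptions, not a real device interface.

```python
# Purely illustrative: map a stream of vibration-sensor readings to audible
# pitches, as the DAW analysis step might. The 0.0-1.0 reading range and
# the linear frequency mapping are assumptions for the sketch.
def to_frequency(sample: float, base_hz: float = 220.0, span_hz: float = 660.0) -> float:
    """Map a normalized sensor reading (0.0-1.0) to a frequency in Hz."""
    sample = min(1.0, max(0.0, sample))  # clamp to the expected range
    return base_hz + sample * span_hz

readings = [0.0, 0.5, 1.0]  # pretend piezo samples after amplification
print([to_frequency(r) for r in readings])  # → [220.0, 550.0, 880.0]
```

A real setup would read samples from an audio interface and smooth them before mapping, but the idea is the same: any changing signal from the plant can drive pitch.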
nishio.icon Comment
I don't know whether piezoelectric sensors are accurate enough for this.
Still, it's valuable just to realize that "the problem might be solvable that way."
Because once you realize it, you can look into it, test it, and develop it!
I thought it would be interesting to combine moving plants like morning glories with "4. light sensor".
If you make a lot of small tiles that make sounds when the brightness changes, you might be able to do some interesting things with the combination of the tiles.
Maybe a small animal instead of a plant would be interesting?
I noticed in "5. Plant growth data" that when I asked this question I implicitly thought "real time". But it would be interesting to have a time-lapse video of the growth process with a song to accompany the growth.
Other
The description was too cumbersome to follow, so I changed it to a shell script.
Example of coaching an essay
https://gyazo.com/7641cdbd928995d255b82752edd676e7
I'll change my icon next time (lol)
Questions/Consultation
nishio.hirokazu@gmail.com
To schoolteachers
Please understand that while this system could be used extensively within a single school, having one school eat up the budget works against the goal of providing opportunities to a wide range of students.
Please feel free to use it as a tool for pilot programs, such as making use cases and getting budgets. I think we all share the same goal of "giving kids a better opportunity."
"Hide log" function 2023-03-29
ChatGPT has no memory of past exchanges; the conversation continues because the entire chat log so far is re-sent with every request.
Since there is a limit to how much log the model can read, many implementations simply discard the oldest messages when the limit is reached.
I think this behavior makes it hard for users to understand that the AI has "limited memory," so I'd like users to be able to specify explicitly what gets discarded. (The checkbox next to each post is for that purpose.) Since that is not implemented yet, for now I have added the ability to hide all the logs at once.
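The "discard the oldest" strategy described above can be sketched like this. This is a hypothetical illustration, not this service's actual code, and the four-characters-per-token estimate is a rough assumption (real implementations count tokens with the model's tokenizer).

```python
# Minimal sketch of the common "discard oldest" strategy: drop messages
# from the front of the log until the estimated token count fits the limit.
def rough_tokens(text: str) -> int:
    # Crude approximation: roughly 4 characters per token for English text.
    return max(1, len(text) // 4)

def trim_log(messages: list[dict], limit: int) -> list[dict]:
    """Drop the oldest messages until the total estimated tokens fit `limit`."""
    trimmed = list(messages)
    while len(trimmed) > 1 and sum(rough_tokens(m["content"]) for m in trimmed) > limit:
        trimmed.pop(0)  # silently discard the oldest message
    return trimmed

log = [
    {"role": "user", "content": "a" * 400},       # ~100 tokens
    {"role": "assistant", "content": "b" * 400},  # ~100 tokens
    {"role": "user", "content": "c" * 40},        # ~10 tokens
]
print(len(trim_log(log, 120)))  # the oldest message is dropped to fit
```

The "silently" part is exactly what makes limited memory hard to notice: the user never sees which messages fell off the front.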
How it started
Q: I started getting errors when my tokens are full.
https://gyazo.com/822d6f7e306623361a6b3428644aeb44
A: When the log-reading limit is reached, ChatGPT is supposed to silently exclude the oldest logs from being sent, but the current implementation doesn't do this yet. I will fix it tomorrow. How to fix it is tricky; I think it would be better to show explicitly when logs are dropped...
In the meantime, I've added a "Clear Log" button, so if it fills up, please delete it.
Room creation function 2023-04-04
The name may be improved later.
As discussed above, GPT reads all the logs before answering. But sometimes you don't need it to read the logs, such as when the topic changes. That's when you can create a "new room."
Carrying around a lot of unneeded log costs extra time and money.
With this feature, I think the "hide logs" feature above is no longer needed.
"Role Setting" function2023-04-03
https://gyazo.com/037c555c474233f8023346dd5d5b10dd
You can set what to put in role: "system" in OpenAI API. If you are not sure, there is no need to tamper with it.
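Concretely, whatever you type in that box becomes the `system` message sent ahead of the chat log in the OpenAI chat API. The sketch below is illustrative (not this service's actual code); the request shape follows the API's `messages` format.

```python
# Sketch of how the "Role Setting" text maps onto an OpenAI chat request:
# the setting becomes the first message with role "system", followed by
# the existing chat log. Illustrative payload only.
def build_request(system_text: str, chat_log: list[dict], model: str = "gpt-4") -> dict:
    messages = [{"role": "system", "content": system_text}] + chat_log
    return {"model": model, "messages": messages}

req = build_request(
    "You are a friendly tutor for students.",
    [{"role": "user", "content": "Explain photosynthesis briefly."}],
)
print(req["messages"][0]["role"])  # → system
```

The system message steers the model's behavior for the whole conversation, which is why leaving it alone is fine if you don't have something specific in mind.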
@nishio: The system role can also be edited at any time. This just means you can tinker with it if you're interested; if you don't know what it is, you can leave it as is.
Listen-in function 2023-04-03
@nishio: You can now choose whether to ask GPT-3.5 or GPT-4. One way to use this: first talk things over with the cheap, responsive GPT-3.5 to sort out what you want to ask, then have the smart GPT-4 answer based on those logs. https://gyazo.com/1c4a16ae8f9ab16644b71309851c06cf
I really wanted to show the exact amount consumed, but it was harder to implement than expected, so I'm using a rough estimate.
At 150 yen to the dollar, a full 8K context costs about 36 yen if it is all prompt tokens and about 72 yen if it is all completion tokens; in practice both the exchange rate and the prompt/completion ratio fluctuate. https://openai.com/pricing
@nishio: Now you can translate Japanese→English and English→Japanese! https://gyazo.com/233a64ba6c206d0af4fd52fcc2580708
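The rough estimate above can be reproduced from the GPT-4 8K rates of the time ($0.03 per 1K prompt tokens, $0.06 per 1K completion tokens). The 150 yen/dollar rate is the page's own assumption, not a fixed constant.

```python
# Rough cost estimator in the same spirit as the page's estimate.
# Rates: GPT-4 8K pricing at the time; the exchange rate is an assumption.
PROMPT_USD_PER_1K = 0.03
COMPLETION_USD_PER_1K = 0.06
YEN_PER_USD = 150

def cost_yen(prompt_tokens: int, completion_tokens: int) -> float:
    usd = (prompt_tokens / 1000) * PROMPT_USD_PER_1K \
        + (completion_tokens / 1000) * COMPLETION_USD_PER_1K
    return usd * YEN_PER_USD

# A full 8K context, all prompt: 8 * $0.03 * 150 = 36 yen
print(cost_yen(8000, 0))  # → 36.0
# A full 8K context, all completion: 8 * $0.06 * 150 = 72 yen
print(cost_yen(0, 8000))  # → 72.0
```

Since real conversations mix prompt and completion tokens, the true cost per exchange falls somewhere between the two bounds.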
Other
@nishio: The reason I suddenly added a melody that plays when GPT-4 finishes outputting is so that you don't have to keep watching the screen; you can train your muscles while GPT-4 writes the code.
@nishio: Hearing a sound when GPT-4 finishes is convenient; you notice even when you have another tab open and are working on something else.
@nishio: I realized that if people could enter their own API key, adults who think "I'd like to use it, but it says it's for middle and high school students" could use it too.
@nishio: Now you can use your own API key! https://pbs.twimg.com/media/Fs7EJPQagAACSf7.jpg
2023-04-28
https://gyazo.com/8bdaaf6a0f82fd72bb9df64b124f530d
One month after release, the cost was roughly 100,000 yen. It looks like we can continue for another 1.5 months.
https://gyazo.com/6ec92a0c92736cfd570c5ea8cefc3757
2023-06-01
https://gyazo.com/0ca9658f24fe2d5825a8e9554278ba6b
2023-08-03
https://gyazo.com/030c1433c95e1377401a27290dd5f9df
https://gyazo.com/1f1fba853507980c9e66f736dc4efe4c
---
This page is auto-translated from /nishio/中高生のためのChatGPT using DeepL. If you find something interesting but the auto-translated English is not good enough to understand, feel free to let me know at @nishio_en. I'm very happy to share my thoughts with non-Japanese readers.